Probabilistic learning of indexed families under monotonicity constraints: hierarchy results and complexity aspects

Author

  • Léa Meyer
Abstract

We are concerned with the probabilistic identification of indexed families of uniformly recursive languages from positive data under monotonicity constraints. Specifically, we consider conservative, strong-monotonic, and monotonic probabilistic learning of indexed families with respect to class-comprising, class-preserving, and proper hypothesis spaces, and investigate the probabilistic hierarchies in these learning models. In the setting of learning indexed families, probabilistic learning under monotonicity constraints is more powerful than deterministic learning under monotonicity constraints, even if the probability is close to 1, provided the learning machines are restricted to proper or class-preserving hypothesis spaces. In the class-comprising case, each of the investigated probabilistic hierarchies has a threshold. In particular, we show for class-comprising conservative learning, as well as for learning without additional constraints, that probabilistic identification and team identification are equivalent. This yields discrete probabilistic hierarchies in these cases.

In the second part of our work, we investigate the relation between probabilistic learning and oracle identification under monotonicity constraints. We address the question of how much additional information provided by oracles is sufficient and necessary to compensate for the additional power of probabilistic learning machines. In this context, we introduce a complexity measure in order to assess the additional power of the probabilistic machines in qualitative terms. One main result is that for each oracle A ≤_T K, there exists an indexed family L_A which is properly conservatively identifiable with probability p = 1/2 and which exactly reflects the Turing degree of A, i.e., L_A is properly conservatively identifiable by an oracle machine M[B] iff A ≤_T B. However, not every indexed family that is conservatively identifiable with probability p = 1/2 reflects the Turing degree of an oracle. Hence, the conservative probabilistic learning classes are more richly structured than the Turing degrees below K. Finally, we prove that there exist learning problems which are conservatively (monotonically) identifiable with probability p = 1/2 (p = 2/3), but which are deterministically conservatively (monotonically) identifiable only by oracle machines having access to TOT.

Dean: Prof. Dr. Gerald Urban
Chair of the examination committee: Prof. Dr. Thomas Ottmann
First referee: Prof. Dr. Britta Schinzel
Second referee: Prof. Dr. Thomas Zeugmann (University of Lübeck)
Date of defense: 31 May 2001
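To make the underlying learning model concrete, the following is a minimal sketch, not taken from the thesis, of a conservative learner identifying a toy indexed family in the limit from positive data. The family L_j = {0, ..., j} and the helper names `member` and `conservative_learner` are hypothetical choices made only for this illustration.

```python
# Illustrative sketch (not from the thesis): conservative identification in
# the limit from positive data, for the toy indexed family L_j = {0,...,j}.

from typing import Iterable, List, Optional


def member(j: int, w: int) -> bool:
    """Uniformly recursive membership test for the toy family L_j = {0,...,j}."""
    return 0 <= w <= j


def conservative_learner(text_prefix: Iterable[int]) -> Optional[int]:
    """Return a hypothesis (an index j) after seeing a finite prefix of a text.

    The learner is conservative: it keeps its current hypothesis unless some
    datum seen so far contradicts it, and on a mind change it outputs the
    least index consistent with all data observed so far.
    """
    hypothesis: Optional[int] = None
    seen: List[int] = []
    for w in text_prefix:
        seen.append(w)
        if hypothesis is None or not all(member(hypothesis, x) for x in seen):
            # Mind change only on an inconsistency: for this family the least
            # consistent index is simply the largest datum seen so far.
            hypothesis = max(seen)
    return hypothesis


if __name__ == "__main__":
    # A positive presentation (text) for L_3 = {0, 1, 2, 3}; after finitely
    # many data the learner converges to the correct index 3 and never
    # changes its mind again -- identification in the limit.
    text = [1, 0, 3, 2, 3, 1, 0]
    for n in range(1, len(text) + 1):
        print(n, conservative_learner(text[:n]))
```

Probabilistic and team learners as studied in the thesis build on such deterministic machines; for instance, a team of n learners can be simulated by a single probabilistic learner that succeeds with probability at least 1/n by choosing one team member uniformly at random before reading any data.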


Similar resources

A Guided Tour Across the Boundaries of Learning Recursive Languages

The present paper deals with the learnability of indexed families of uniformly recursive languages from positive data as well as from both positive and negative data. We consider the influence of various monotonicity constraints on the learning process, and provide a thorough study concerning the influence of several parameters. In particular, we present examples pointing to typical problems and...


The Learnability of Recursive Languages in Dependence on the Hypothesis Space

We study the learnability of indexed families L = (L_j)_{j∈IN} of uniformly recursive languages under certain monotonicity constraints. Thereby we distinguish between exact learnability (L has to be learnt with respect to the space L of hypotheses), class-preserving learning (L has to be inferred with respect to some space G of hypotheses having the same range as L), and class-comprising inferen...


Learning Indexed Families of Recursive Languages from Positive Data

In the past 40 years, research on inductive inference has developed along different lines, concerning different formalizations of learning models and in particular of target concepts for learning. One common root of many of these is Gold’s model of identification in the limit. This model has been studied for learning recursive functions, recursively enumerable languages, and recursive languages...


Learning indexed families of recursive languages from positive data: A survey

In the past 40 years, research on inductive inference has developed along different lines, e.g., in the formalizations used, and in the classes of target concepts considered. One common root of many of these formalizations is Gold’s model of identification in the limit. This model has been studied for learning recursive functions, recursively enumerable languages, and recursive languages, refle...


Monotonicity, Activity and Sequential Linearization in Probabilistic Design Optimization

Design optimization under uncertainty is considered in the context of problems with probabilistic constraints. Probabilistic optimization has been studied for several years; in this dissertation, some theoretical developments in certain classes of deterministic optimization problems are extended to probabilistic ones. Specifically, the optimality conditions of probabilistic optimization and pro...


